As almost anyone who works in education will tell you, the use of generative artificial intelligence (GenAI) tools has fundamentally changed aspects of education. While such tools offer possibilities for extending knowledge and developing new research methods, they are very much a double-edged sword that brings significant challenges. Below we provide some advice explaining why students should be wary of engaging with them.

1. Using material generated by AI without attribution often counts as plagiarism

Many institutions have changed their definition of plagiarism to incorporate the unattributed use of material generated by GenAI. One of our own institutions, for instance, now defines such plagiarism as ‘the act of presenting […] content generated by artificial intelligence (AI) tools as your own without proper acknowledgement.’ Submitting work generated by GenAI tools without giving credit would therefore likely fall under such a definition.

2. Even if credit is given, you may still end up with a bad mark (or worse)

Some institutions and courses have adopted a confessionary approach to the use of GenAI tools for some assessments, with students encouraged to explain how they have used them. If such tools have been used to help you develop interesting methods or explore large or under-examined datasets, then such practice should certainly be encouraged. However, if the information provided under such confessionary approaches suggests you have essentially got a GenAI tool to write your assessment, you may not get a good grade. Depending on the parameters of GenAI usage allowed, you may also still fall foul of academic integrity processes.

3. Even if you gain a good mark, you have not been through the learning process involved in writing

Writing is an iterative, rather than a linear, process. It leads to learning that occurs as information and ideas about the world are synthesised into narratives that are (hopefully) intelligible to others and (if one is lucky, and often after significant work) reveal new insights. However, if the writing process is largely outsourced then, whatever the grade attained, you may miss out on fully understanding these insights.

4. Childlike inferences

According to Karsten Nohl of SR Labs, GenAI tools ‘are very trusting technology, almost childlike’, as they have a ‘huge memory, but very little in terms of the ability to make judgment calls.’ Expanding on the consequences of this, Nohl explains, ‘[i]f you basically have a child narrating back stuff it heard elsewhere, you need to take that with a pinch of salt.’ Would you want your nine-year-old self writing your assessments? If not, then think twice about how you use generative artificial intelligence tools whilst working towards assessments.

5. It makes things up

In a process known as hallucination, GenAI tools often make things up. In ‘Assessing ChatGPT as a Tool for Research on US State and Territory Politics’, a paper for Political Studies Review exploring the utility of ChatGPT as a research tool for sub-national US politics, we demonstrated, for example, that such hallucinations occurred within profiles of US states and territories that we generated with ChatGPT-4 for a project titled 50 States or Bust!

6. Limited datasets

Even the best generative artificial intelligence tools are only as good as the data they are trained on, with some trained on material freely available online. Yet much material online is locked behind paywalls, and some content created prior to the internet is not online at all. Moreover, there are ongoing lawsuits related to what material such large language models can use and how. As such, before using such tools, ask yourself whether you really know what material they were trained on, or understand how this training was done. If the answer to either question is no, then ask yourself whether you should use them to engage in complex processes such as research and writing.

7. Learning is a life-long process, but current GenAI tools will soon be old news

Learning is a life-long process, but technology develops at breakneck speed. What seems cutting edge today will likely seem outdated very quickly (Friends Reunited, anyone?). If you master only the use of the current batch of artificial intelligence tools without understanding the building blocks of knowledge such as research and writing, you leave yourself vulnerable to your skill set becoming obsolete very quickly. Understand and master these key building blocks, however, and you have the benefit of a skill set that can be retooled each time technology moves on.

8. Reading is awesome: why leave it to an algorithm?

There are various tools that can be used to summarise text, and these may seem like a tempting shortcut. However, if you use them as a substitute for reading, you will have no idea whether the summary provided is accurate (and remember to keep those childlike inferences mentioned above in mind). Moreover, you will miss out on the genuine joy, not to mention learning, that can come from reading.

9. Think of your favourite song: could an algorithm have written it (and would you like it as much if that was the case)?

Irrespective of taste, you likely have a favourite song, or a band or singer that you would stand in line for hours to watch. You might like classical music, sweet soul tracks, or synth pop, or you might be a Swiftie. It is likely that at least part of your attraction to this music is the human connection it generates. Would you have the same reaction if it had been fully generated by an algorithm? While there is no right answer to this question, we guess that for many the answer is no.

Writing text can be as transformative as making and listening to music. It can, to give just three examples, explore crimes of the powerful, document tragically important history, or help us better understand political ideas. If you use generative artificial intelligence tools to replace the tough, but joyously human, task of crafting prose that captures an event, a period in history, a love affair, or the development of technology, then you miss out on developing your use of language, your understanding of prose, and an evergreen skill that allows you to connect with others.


About Author

Dr Pete Finn

Dr Pete Finn is a multi-award-winning Senior Lecturer in Politics at the Faculty of Business and Social Sciences, Kingston University, London. He is interested in democracy, with a particular focus on elections, national security, and the Official Record. He is co-editor of The Official Record: Oversight, national security and democracy (Manchester University Press, 2024).

Lauren C. Bell

Lauren C. Bell is the inaugural James L. Miller Professor of Political Science at Randolph-Macon College, in Ashland, Virginia. In addition, she currently serves as the college's Associate Provost and Dean of Academic Affairs.

Amy Tatum

Lecturer in Communication and Media at Bournemouth University

Dr Caroline V. Leicht

Tutor in Media, Culture and Society (Sociological & Cultural Studies) at the University of Glasgow
